GPT-3 Scared You? Meet Wu Dao 2.0: A Monster of 1.75 Trillion Parameters
Jack Clark, OpenAI's policy director, calls this trend of copying GPT-3 "model diffusion." Yet, among all the copies, Wu Dao 2.0 holds the record as the largest, with a striking 1.75 trillion parameters (10x GPT-3). Coco Feng reported for the South China Morning Post that Wu Dao 2.0 was trained on 4.9TB of high-quality text and image data, which makes GPT-3's training dataset (570GB) pale in comparison. It's worth noting, though, that OpenAI researchers curated 45TB of raw data to extract those clean 570GB. Wu Dao 2.0 can learn from both text and images and tackle tasks that combine the two types of data (something GPT-3 can't do).
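The headline figures above reduce to simple ratios. A quick back-of-the-envelope check in Python, using only the numbers reported in the article (the "10x" claim refers to parameter count; the training-data gap works out to a smaller multiple):

```python
# Rough scale comparison between Wu Dao 2.0 and GPT-3,
# using the figures reported in the article.
WU_DAO_PARAMS = 1.75e12   # 1.75 trillion parameters
GPT3_PARAMS = 175e9       # 175 billion parameters
WU_DAO_DATA_TB = 4.9      # training data, in terabytes
GPT3_DATA_TB = 0.570      # 570 GB expressed in TB

param_ratio = WU_DAO_PARAMS / GPT3_PARAMS
data_ratio = WU_DAO_DATA_TB / GPT3_DATA_TB

print(f"Parameter ratio: {param_ratio:.0f}x")       # 10x
print(f"Training-data ratio: {data_ratio:.1f}x")    # ~8.6x
```

Note that "10x the parameters" does not straightforwardly mean "10x more capable"; parameter count is one proxy for scale among several (data volume, compute, architecture).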
Meet Wu Dao 2.0, the Chinese AI model making the West sweat
A new artificial intelligence model developed by Chinese researchers is performing untold feats with image creation and natural language processing -- making rivals in Europe and the U.S. nervous about falling behind. The model, dubbed Wu Dao 2.0, can understand what people say -- grammar included -- and can also recognize images and generate realistic pictures from descriptions. It can write essays and poems in traditional Chinese, as well as predict the 3D structures of proteins, POLITICO's AI: Decoded reported. Developed by the government-funded Beijing Academy of Artificial Intelligence and unveiled last week, Wu Dao 2.0 appears to be among the world's most sophisticated AI language models. Wu Dao 2.0's creators say it's 10 times more powerful than its closest rival, GPT-3, developed by the U.S. firm OpenAI.